DEVICE FOR SIGNALING OBJECTS TO A NAVIGATION MODULE OF A VEHICLE EQUIPPED WITH SAID DEVICE
Patent abstract:
The device for real-time signaling of at least one object to a vehicle navigation module (60) comprises a first sensor (1) arranged to produce first sensor data comprising a first position and a first speed of the object captured with respect to the vehicle. In addition, the device comprises: at least one second sensor (2, 3, 4) arranged to produce second sensor data comprising a second position and a second speed of the object captured with respect to the vehicle; a synchronization module (20) arranged to produce synchronized data comprising a first synchronized position computed from the first captured position and the first speed, and at least a second synchronized position computed from the second captured position and the second speed; and a fusion module (30) arranged to produce merged data comprising a merged position computed from the first synchronized position and the second synchronized position, so as to signal said at least one object to the navigation module (60) by communicating to it all or part of the merged data.
Publication number: FR3020616A1
Application number: FR1453910
Filing date: 2014-04-30
Publication date: 2015-11-06
Inventors: Javier Ibanez-Guzman; Vincent Fremont; Stephane Bonnet; Miranda Neto Arthur De
Applicant: Renault SAS
Main IPC class:
Patent description:
[0001] The invention relates to a device for signaling one or more objects to a vehicle navigation module, and to a vehicle equipped with such a device, in particular an autonomous motor vehicle. Signaling objects, including objects that constitute actual or potential obstacles, aims to provide an autonomous mobile system with a perception of the environment, natural or constrained, in which it evolves. [0002] This perception is obtained by processing data acquired by sensors of the video, lidar, radar or even ultrasound type, in particular by constructing a local and dynamic digital map of this environment from the distances to the various objects of the scene and from their speeds. This map serves as basic knowledge for planning and executing appropriate automatic maneuvers. For such autonomous vehicles, perception is the first task to be performed, before decision and action. It provides a specific representation of the environment and of the vehicle's own state, by extracting and integrating over time key properties of the sensor data, so as to be understandable by a computer. The perception task fulfills two functions: it makes it possible to detect the various objects, moving or not, in the immediate environment, and to estimate their states, such as their position with respect to the vehicle, their speed, their direction and their size. Perception thus serves as an input to the trajectory generation module and can be used to predict possible collisions and to perform obstacle avoidance. Perception is considered the most important source of information for autonomous vehicles. In recent years, so-called "smart" sensors have appeared on the market and are increasingly used in functions such as assisted braking (ABS). These intelligent sensors not only detect objects, they can also characterize them. They are nevertheless not perfect, hence the need to improve the performance of perception systems by combining complementary sensors. Using several sensors also makes it possible to introduce redundancy in order to obtain more robust perception systems, which is indispensable for autonomous navigation. In these multi-sensor data fusion approaches, the goal is to combine information from different sensors to increase the reliability, accuracy and completeness of the perceived situation. The output of such a system is a list of objects with their attributes and an integrity measure. In a perception task, integrity can be defined as the ability to provide confidence information while taking into account noise (random errors), uncertainty and outliers. Most perception approaches are generic, and therefore suffer from a problem of optimality with respect to the intended application and do not provide high-level information to support planning for autonomous navigation. Through the present technical solution, it is proposed to improve the integrity of multi-sensor perception and to help decision-making for autonomous navigation through the extraction and use of contextual and semantic information from the observed scene. Document EP0928428B1 discloses, for example, a method for evaluating the measurement quality of a distance measuring sensor on an autonomous mobile system. It evaluates the measurement quality with respect to a cell as a function of the number of other measurement sensors that report the same occupancy state for that cell as the evaluated sensor, the quality being rated higher as the number of sensors
confirming the occupancy state increases. However, the disclosed method has several drawbacks for its use in a device for signaling one or more objects to a vehicle navigation module. Among these drawbacks, one may note the need for cells, which raises measurement uncertainties for objects that straddle the cell grid, in particular for moving objects passing from one cell to another between two sensor measurements. A first problem addressed here is therefore to do away with a grid partitioning the environment into cells. Another problem is that of having enough data to report each object to a navigation module as well as possible. More generally, another problem still present in the prior art is that of improving the performance of detection, and of the exploitation of the detection, beyond a simple quality evaluation. In order to remedy the problems of the prior art, the subject of the invention is a device for signaling at least one object to a vehicle navigation module, comprising a first sensor arranged to produce first sensor data comprising a first position and a first speed of the object captured with respect to the vehicle, characterized in that it comprises: at least one second sensor arranged to produce second sensor data comprising a second position and a second speed of the object captured with respect to the vehicle; a synchronization module arranged to produce synchronized data comprising a first synchronized position computed from the first captured position and first speed, and at least a second synchronized position computed from the second captured position and second speed; and a fusion module arranged to produce merged data comprising a merged position computed from the first synchronized position and the second synchronized position, so as to signal said at least one object to the navigation module by communicating said merged data to it. [0003] Advantageously, the device comprises a risk assessment module arranged to complete the merged data with a risk indicator associated with each reported object and generated from the merged data. In particular, the risk assessment module comprises a set of blocks for computing scores from one or more items of merged data, and a block for combining the scores computed by all or part of the set of score computation blocks, so as to generate the risk indicator of each object by combination of scores. [0004] More particularly, said set comprises a distance score computation block, an angle score computation block, a speed score computation block and/or an object type score computation block. More particularly, said set comprises at least one score computation block that additionally uses cartographic data. More particularly, said combination block is arranged to combine the scores by weighted summation of all or part of the computed scores. Usefully, the fusion module is arranged to merge at least one additional synchronized position into said merged position obtained from the first synchronized position and the second synchronized position. Advantageously, the synchronized data accessed by the fusion module further comprise a position variance associated with each synchronized position, so that the synchronized positions, weighted by the associated variances, are summed in order to merge them. The synchronized data also advantageously comprise a position variance associated with each synchronized position, so that the fusion module is arranged to produce merged data comprising a merged position from the first synchronized position and the second synchronized position,
so as to report the same object to the navigation module by communicating said merged data to it when a Mahalanobis distance between said synchronized positions is less than a predetermined threshold, and not to produce merged data comprising a merged position from the first synchronized position and the second synchronized position, so as to signal two different objects to the navigation module, when said Mahalanobis distance is greater than the predetermined threshold. The invention also relates to a motor vehicle characterized in that it comprises a device according to the invention. Other features and advantages of the invention will become apparent on reading the following detailed description, for the understanding of which reference is made to the appended drawings, in which: FIG. 1 is a diagram showing a vehicle equipped with the signaling device according to the invention; FIGS. 2 and 3 are schematic views of the vehicle environment, explaining the utility of the device with which it is equipped; FIG. 4 is a diagram of a device according to the invention; FIG. 5 is a diagram of the synchronization module of the device of FIG. 4; FIG. 6 is a diagram of the fusion module of the device of FIG. 4; FIGS. 7 to 10 are diagrams of risk factor score computation blocks; FIG. 11 is a diagram of the block combining the scores to generate a risk indicator; FIG. 12 shows process steps according to the invention. In the remainder of the description, elements having an identical structure or similar functions are designated by the same reference number. [0005] Figure 1 shows a vehicle 12 conventionally equipped with front wheels 21, rear wheels 22 and a steering wheel 9 for steering the front wheels 21 through a steering column 8. The acceleration and deceleration of a power train 11, electric, thermal or hybrid, are controlled by an on-board electronic system 10, which is also programmed to control, on the steering column 8, a steering angle set from the steering wheel 9 in manual mode or from a navigation module 60 in autonomous mode. The navigation module 60 comprises, in a manner known per se, a trajectory controller such as, for example but not necessarily, the one described in document WO2014/009631. The on-board electronic system 10 is moreover connected, directly or indirectly, to means of action on the vehicle 12, in particular mechanical braking means, known elsewhere, acting on the wheels 21, 22 in the event of a high risk of collision with a detected object. The on-board electronic system 10 is connected to sensors 1, 2, 3, 4, without the quantity of sensors being limited to four, preferably installed on the vehicle, so as to process information received about the vehicle environment by means of digital processing modules, as explained below. [0006] FIG. 2 shows a possible environment of the vehicle 12, occupied by objects 13, 14, 15 which may constitute fixed or mobile obstacles that the navigation module 60 must take into account in order to control a movement of the vehicle 12 in this environment without risking hitting one of the objects. The sensor 1 is, for example but not necessarily, a radar perception sensor whose perception field is shown in phantom lines towards the front of the vehicle. The sensor 2 is, for example but not necessarily, a video camera whose field of perception is represented in dashed lines towards the front of the vehicle. Using sensors of different technologies is advantageous for many reasons.
For example, the video camera may make it possible to distinguish more precisely the type of each object: pedestrian type for each of the objects 13 and 14, front-vehicle type for the object 15, lane-centre type materialized by the marking strips painted on the ground, provided that the lighting and visibility conditions allow it. While the type distinction is weaker for the radar perception sensor, the latter allows a detection that is less sensitive to lighting and visibility conditions. Other types of sensors can be used, for example, purely by way of illustration and without limitation, stereo video cameras, lidar, ultrasound or infrared sensors. At least two of the sensors used are smart sensors. [0007] A sensor is said to be intelligent in that it is arranged to process the signal it captures so as to detect the objects 13, 14, 15 and, for each detected object, to produce sensor data comprising a position and a speed of the object as captured in a frame of reference relative to the sensor. Advantageously, it is also possible to parameterize the sensor so as to convert the position and velocity coordinates obtained in the frame relative to the sensor 1, 2 into position coordinates x, y and velocity coordinates Vx, Vy expressed in a frame of reference relative to the vehicle 12, for example by taking into account the location at which the sensor is installed on the vehicle. The data produced by each intelligent sensor of index i are preferably structured as a list Lci of detected objects O1, ..., Ok, ..., ON, comprising in its header the number N of objects detected by the sensor, a timestamp code tsp and a latency indicator t1. To produce the timestamp code tsp at the beginning of the list Lci, the sensor can access a universal clock, for example a GNSS (Global Navigation Satellite System) clock commonly available in the navigation module 60. The sensor 1, 2 then writes into the code tsp the instant at which it produces the list Lci (Lc1 for the sensor 1, Lc2 for the sensor 2). The sensor 1, 2 simultaneously records in the latency indicator t1 the time separating the actual capture instant of the signal from the instant of production of the list Lci, in other words the time consumed to process the signal so as to produce the list Lci. The latency indicator t1 can also take into account the propagation speed of the perceived signal itself; it is known, for example, that the speed of sound is much lower than that of light. As explained in the following description, the invention can also be implemented with sensors that do not produce a timestamp code tsp in the header of the list Lci. With each object O1, ..., Ok, ..., ON listed in the list Lci is associated a sub-list of attributes comprising an object identifier Id, a position of the object given by its position coordinates x, y, a displacement speed of the object given by its velocity coordinates Vx, Vy, a position uncertainty evaluated by statistical variances σ²x, σ²y, an object type (pedestrian for object 13 or 14, vehicle for object 15) and a dimension of the object, as these data are captured by the sensor of index i that produces the list Lci, i varying from 1 to Q, where Q is the number of sensors. Thus, with each of the objects 13, 14, 15 located in front of the vehicle, both in the perception field of the sensor 1 and in that of the sensor 2, are associated two sub-lists of attributes, one associated with the object as listed in the list Lc1 of the sensor 1 and the other associated with the same object as listed in the list Lc2 of the sensor 2.
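Purely by way of illustration, the list structure Lci described above can be sketched in Python as follows; the class and field names (DetectedObject, SensorList) are not taken from the patent and are chosen here only for readability, with positions assumed in metres and speeds in metres per second:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DetectedObject:
    """One attribute sub-list of a list Lci: an object Ok as captured by sensor i."""
    obj_id: int        # identifier Id
    x: float           # position coordinate x in the vehicle frame
    y: float           # position coordinate y in the vehicle frame
    vx: float          # velocity coordinate Vx in the vehicle frame
    vy: float          # velocity coordinate Vy in the vehicle frame
    var_x: float       # position variance sigma_x^2
    var_y: float       # position variance sigma_y^2
    obj_type: str      # e.g. "pedestrian" or "vehicle"
    dimension: float   # characteristic size of the object

@dataclass
class SensorList:
    """Header plus objects of a list Lci produced by the smart sensor of index i."""
    sensor_index: int                      # i
    timestamp: float                       # tsp, instant at which the list is produced
    latency: float                         # t1, processing (and propagation) delay
    objects: List[DetectedObject] = field(default_factory=list)

    @property
    def count(self) -> int:
        """N, the number of objects detected by the sensor."""
        return len(self.objects)
```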
Figure 3 shows another possible environment of the vehicle 12, occupied by the objects 13, 14, 15. The environment shown in Figure 3 corresponds to a crossroads, where objects can come from the right, such as the pedestrian-type object 13 that is about to cross the road, as well as from the left, such as the pedestrian-type object 14 which is also about to cross the road. The sensor 3, for example but not necessarily a radar perception sensor or a video camera, covers a perception field shown in intermittent phantom lines towards the front right of the vehicle. The sensor 4, for example but not necessarily a sensor of the same type as the sensor 3, covers a perception field shown in intermittent dashed lines towards the front left of the vehicle. [0008] Thus, with the object 13, which is on the right at the front of the vehicle, both in the perception field of the sensor 1 and in that of the sensor 3, are associated two sub-lists of attributes, one associated with the object 13 as listed in the list Lc1 of the sensor 1 and the other associated with the object 13 as listed in the list Lc3 of the sensor 3. Likewise, with the object 14, which is on the left at the front of the vehicle, both in the perception field of the sensor 1 and in that of the sensor 4, are associated two sub-lists of attributes, one associated with the object 14 as listed in the list Lc1 of the sensor 1 and the other associated with the object 14 as listed in the list Lc4 of the sensor 4. With the object 15, which is further ahead of the vehicle on the left, in the perception field of the sensor 1, in that of the sensor 2 and in that of the sensor 4, are associated more than two sub-lists of attributes, each associated with the object 15 as listed in each of the lists Lc1 of the sensor 1, Lc2 of the sensor 2 and Lc4 of the sensor 4. Interestingly, it is possible to provide sensors in sufficient number to obtain perception fields covering the entire periphery of the vehicle 12, with the field sectors overlapping at least two by two. It will therefore be understood that the number Q of sensors can exceed four, and that the invention is already of interest as soon as Q equals two. The sensors 1, 2, 3, 4 are connected to the device illustrated by the diagram of Figure 4, directly by a wired link or by a local bus of one of the known types (CAN, LIN, automotive Ethernet or other), or possibly via an external communication module if the invention is pushed so far as to use fixed sensors outside the vehicle. In this way, the structured sensor data, represented in the embodiment presented here purely for illustrative purposes by the lists named Lc1, Lc2, Lc3, Lc4, each produced by one of the sensors labeled 1, 2, 3, 4, are communicated to a synchronization module 20. [0009] FIG. 5 makes it possible to explain the structure and operation of the synchronization module in more detail. As seen above, each of the Q sensors is arranged to produce data structured here as a list Lci, i varying from 1 to Q. The list Lci lists a number N of objects Ok, k varying from 1 to N. Note that the number N of objects may differ from one list Lci to another. Thus, in the example of FIG. 3, N is three for the list Lc1, which lists the objects 13, 14, 15; N is one for the list Lc2, which lists the single object 15; N is one for the list Lc3, which lists the single object 13; and N is two for the list Lc4, which lists the objects 14 and 15.
[0010] As seen in FIG. 5, with each object Ok of the list Lci is associated a sub-list of attributes comprising a position and a speed of the object Ok captured by the sensor of index i, the position being quantified by its coordinates x, y and the speed by its coordinates Vx, Vy in a frame of reference relative to the vehicle 12. The data produced by one sensor are asynchronous with the data produced by another sensor: the capture instants are not necessarily synchronous from one sensor to another; the signal processing time, in particular for distinguishing the objects and quantifying the attributes of each object, varies from one sensor to another; and, to a lesser extent, the propagation speed of the signal itself, which differs from one sensor technology to another, can introduce a delay between the moment when the signal leaves the object and the moment when the signal is perceived by the sensor. The supplier of each sensor is generally able to determine a latency indicator t1 that groups the processing time, where appropriate the delay related to the signal speed, and possibly other delays that can easily be assessed given the supplier's knowledge of the sensor technology. The value of the latency indicator t1 is, depending on the technology employed, either fixed in the memory of the sensor by the supplier or computed in real time by the sensor itself, in a manner known per se in the field of dedicated processors, as a function of its processing load. Owing to the possible movement of the object 13, 14, 15 and/or of the vehicle 12, the position of the object may have changed between the instant indicated by the timestamp code tsp, which is the instant at which the sensor data are produced, and the earlier instant tsp − t1 at which the object actually occupied the position captured by the sensor. The synchronization module 20 comprises a submodule 24 arranged to compute a delay Δt separating a synchronization instant tsyn from the earlier instant tsp − t1 at which the objects detected by the sensor of index i each actually occupied the position captured by the sensor and indicated in the associated attribute sub-list. In order to share the same synchronization instant tsyn among all the sensors concerned, the submodule 24 accesses a reference clock 23, for example a GNSS (Global Navigation Satellite System) clock, which provides a universal reference time. Because it is specific to each sensor of index i, the value of the latency indicator t1 placed at the beginning of the list Lci comes from the sensor itself. [0011] When the sensor installed on the vehicle is directly connected to the module 20 by a wired link or by a real-time network of the LIN or CAN type with a high priority level assigned to the frames transmitted by the sensors, in other words when the instant of production of the data by the sensor can be considered comparable to the instant of their reception by the module 20, in particular by the submodule 24, it is possible to enter the timestamp code tsp into the list Lci directly at the input of the submodule 24, which benefits from the reference clock 23. This makes it possible to use simpler sensors, which then do not need to transmit the timestamp code tsp and therefore do not need to access a clock providing a universal reference time.
When the sensor installed on the vehicle is not directly connected to the module 20 by a wired link, for example when it is connected by a real-time network of the twisted-pair Ethernet 802.3 type or of the CAN type with a low priority assigned to the frames emitted by the sensors, or for example by a remote communication network for sensors outside the vehicle, in other words when the instant of production of the data by the sensor must be considered appreciably earlier than the instant of their reception by the module 20, in particular by the submodule 24, it is preferable to enter the timestamp code tsp into the list Lci directly at the output of the sensor, or at the output of a system close to the sensor, either of which is arranged to issue the timestamp code tsp by accessing a clock providing the universal reference time. [0012] In order to remedy the synchronization differences related to the diversity of sensor types, to the signal and data transmission times or to the processing load of the module itself, the submodule 24 is preferably arranged to check the delay tsyn − tsp separating the synchronization instant tsyn from the earlier instant tsp at which the data were produced by the sensor of index i for processing by the submodule 24. If the delay tsyn − tsp is greater than or equal to a track obsolescence threshold tTF, the data, considered stale with respect to their production date tsp, are not taken into account, and the submodule 24 does not generate a list Lsi of synchronized data from the received data. If the delay tsyn − tsp is less than the track obsolescence threshold tTF, the submodule 24 uses the delay Δt to produce synchronized data, structured in the example illustrated by FIG. 5 by means of lists Lsi of synchronized objects. Each list Lsi remains associated with the sensor of index i, i being for example 1, 2, 3, 4 for the sensor 1, 2, 3, 4 respectively. While the values of tsp and t1 may differ from one list Lci to another, all the lists Lsi contain the same value tsyn, indicating the synchronization instant common to all the lists Lsi. Each list Lsi can contain a number N of synchronized objects equal to the number N contained in the corresponding list Lci. [0013] With each object Ok listed in the list Lsi under consideration is associated a sub-list of attributes comprising a position and a speed of the object Ok detected by the sensor of index i. The position is quantified here by its coordinates xs, ys estimated at the instant tsyn common to all the objects and, for a given object, common to all the sensors that have detected it. The submodule 24 communicates the computed delay Δt to a submodule 25, which accesses the position and the speed captured in the sub-list of attributes corresponding to the object Ok listed in the list of detected objects Lci, so as to compute the synchronized position of the object Ok by the following formulas:
xs = x + Vx · Δt
ys = y + Vy · Δt
In the sub-list output by the submodule 25, the speed of the object Ok can remain quantified by its coordinates Vx, Vy, insofar as the speed of each object can be considered constant during the delay Δt. The data produced by the module 20 are then synchronous for all the sensors. The lists Lsi of synchronized data produced by the module 20, purely illustrative lists Ls1, Ls2, Ls3, Ls4 in Figure 4, are transmitted to a fusion module 30, now explained in more detail with reference to Figure 6, so as to obtain a snapshot of the environment of the vehicle 12.
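As a minimal sketch of the behaviour of the submodules 24 and 25 described above, and reusing the illustrative SensorList and DetectedObject types from the earlier sketch, the obsolescence check and the constant-velocity re-synchronization could look as follows; the function name and the deep copy are implementation choices made here, not elements of the patent:

```python
import copy
from typing import Optional

def synchronize(lci: SensorList, t_syn: float, t_tf: float) -> Optional[SensorList]:
    """Turn a sensor list Lci into a synchronized list Lsi at the common instant t_syn.

    t_syn: synchronization instant tsyn, read from the reference clock 23.
    t_tf : track obsolescence threshold tTF.
    """
    # Data whose production date tsp is too old are discarded: no Lsi is produced.
    if t_syn - lci.timestamp >= t_tf:
        return None

    # Delay between tsyn and the instant of capture tsp - t1.
    dt = t_syn - (lci.timestamp - lci.latency)

    lsi = copy.deepcopy(lci)
    lsi.timestamp = t_syn              # every list Lsi carries the same tsyn
    for obj in lsi.objects:
        obj.x += obj.vx * dt           # xs = x + Vx * dt
        obj.y += obj.vy * dt           # ys = y + Vy * dt (speed assumed constant over dt)
    return lsi
```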
The purpose of the fusion module 30 is to increase confidence in the information provided, by exploiting the redundancy of information obtained by using multiple sensors. A submodule 31 of the module 30 successively scans each list Lsi transmitted to the module 30 so as to produce merged data, structured in the embodiment illustrated in FIG. 6 by means of a merged list Lf listing all the objects listed in the various lists Lsi of the various sensors of index i equal to 1, 2 and beyond. Thus the number M of objects Ofj, j varying from 1 to M, listed in the list Lf is generally bound to be greater than each number N of objects Osk, k varying from 1 to N, listed in each list of synchronized objects Lsi associated with each sensor. The merged data are initialized with the first synchronized data scanned by the module 30, for example those produced from the first sensor 1. In the example shown in FIG. 6, the list Lf initially reproduces the list Ls1 by pointing to sub-lists of attributes that each reproduce a sub-list of attributes pointed to by the list Ls1. It is advantageous to add, to each sub-list of attributes pointed to by the list Lf, an additional attribute named Sde quantifying a detection score of the object associated with the attribute sub-list considered. The detection score is initialized to unity. For each synchronized list Lsi received, the submodule 31 combines the synchronized list Lsi with the merged list Lf, for example in the following manner, until the last synchronized list, corresponding to the last sensor, has been processed. A submodule 32 of the module 30 is programmed to execute process steps, including those now explained with reference to FIG. 12. The submodule 31 communicates the data of the synchronized list Lsi to the submodule 32, initially in a standby step 318 awaiting the data supplied by the submodule 31. In a step 319, the submodule 32 starts by placing itself on the first sub-list, associated with the object Osk of index k = 1 in the list Lsi. In a step 320, the submodule 32 accesses a covariance matrix Σsk linked to the object Osk of current index k on which it is placed. Considering the values xs, ys of abscissa and ordinate obtained from the sensor as realizations of random variables Xs, Ys quantifying a position of the object Osk with respect to the vehicle 12 at the synchronization instant tsyn, it is recalled that the dispersion of the possible realizations can be measured by means of the variances vark(Xs), vark(Ys) of each random variable and the covariances covk(Xs, Ys), covk(Ys, Xs) between the random variables, from which the covariance matrix Σsk is defined by the following formula F1:
Σsk = [ vark(Xs)      covk(Xs, Ys) ]
      [ covk(Ys, Xs)  vark(Ys)     ]
Considering the random variables Xs, Ys to be independent, it is recalled that the matrix Σsk can also be written in the following form F2:
Σsk = [ σ²xs  0    ]
      [ 0     σ²ys ]
where σxs and σys respectively denote the standard deviation of the random variable Xs and the standard deviation of the random variable Ys, each centered respectively on the realization xs, ys. The standard deviations, or directly their squares, can easily be determined from the technical characteristics of the sensor considered, since they must be known to its supplier. When their values are constant, they can be stored in the memory of the device. When their values vary according to visibility, ambient temperature or other atmospheric factors detectable by the sensor, they can be transmitted in the list Lci and then transcribed into the list Lsi.
It is also possible to establish, in a manner known per se, a relationship between the square of the standard deviation, σ²xs respectively σ²ys, and the measurement uncertainty δxs respectively δys of the sensor, for example for a normal distribution of the uncertainty centered on a zero mean: δxs ~ N(0, σ²xs), respectively δys ~ N(0, σ²ys). In a step 321, the submodule 32 starts by placing itself on the first sub-list, associated with the object Ofj of index j = 1 in the list Lf. In a step 322, the submodule 32 accesses a covariance matrix Σfj linked to the object Ofj of current index j on which it is placed. [0014] The covariance matrix Σfj is defined as above for the values xf, yf of abscissa and ordinate quantifying a representative position of the object Ofj with respect to the vehicle 12 at the synchronization instant tsyn. The covariance matrix Σfj can then be written in the following form F3:
Σfj = [ σ²xf  0    ]
      [ 0     σ²yf ]
where σxf and σyf respectively denote the standard deviation of the random variable Xf and the standard deviation of the random variable Yf, each centered respectively on the realization xf, yf. In a step 323, the submodule 32 checks whether one or more fusion criteria are satisfied. A remarkable criterion is that a Mahalanobis distance d(Osk, Ofj) between the objects Osk and Ofj is less than a predetermined threshold. Here, the Mahalanobis distance is computed from the values xs, xf, ys, yf of abscissa and ordinate of the objects Osk and Ofj, using the following formula F4:
d(Osk, Ofj) = √( (xs − xf)² / (σ²xs + σ²xf) + (ys − yf)² / (σ²ys + σ²yf) )
If the fusion criterion or criteria are satisfied, in particular if the Mahalanobis distance is below the threshold, the submodule 32 executes a step 324 which consists in modifying the object Ofj by merging the object Osk into it, so as to obtain new attribute values computed in the following manner, in particular by applying the Bayes rule to the position coordinates:
xf := (xf · σ²xs + xs · σ²xf) / (σ²xs + σ²xf)
yf := (yf · σ²ys + ys · σ²yf) / (σ²ys + σ²yf)
σ²xf := (σ²xs · σ²xf) / (σ²xs + σ²xf)
σ²yf := (σ²ys · σ²yf) / (σ²ys + σ²yf)
The other attributes of the modified merged object Ofj can be requalified by also applying the Bayes rule with the appropriate standard deviations, or in another way. For example, if the Bayes rule is to be applied to the velocity coordinates, the submodule 32 must be able to access the standard deviations of the synchronized velocity coordinates. It is also possible to requalify the velocity coordinates Vxf, Vyf of the merged object Ofj by taking a simple arithmetic mean of the velocity coordinates Vxs, Vys of the synchronized object Osk and the velocity coordinates Vxf, Vyf of the merged object Ofj. The same applies, for example, to the spatial dimensions Dim of the object. Some attributes, such as the type attribute, are more difficult to average. For cases where the type differs between detected objects eligible for fusion, various solutions can be considered, such as retaining the type coming from the sensor that is most reliable in this respect, cumulating the types, proceeding by vote when several sensors are involved, or treating equality of type as a fusion criterion to be satisfied.
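Under the diagonal-covariance forms F2 and F3 above, the gating test of step 323 and the position update of step 324 can be sketched as follows; this is only one consistent reading of formula F4 and of the Bayes update, written against the illustrative DetectedObject type introduced earlier:

```python
import math

def mahalanobis_distance(osk: DetectedObject, ofj: DetectedObject) -> float:
    """Formula F4 with diagonal covariances: distance between Osk and Ofj."""
    return math.sqrt((osk.x - ofj.x) ** 2 / (osk.var_x + ofj.var_x)
                     + (osk.y - ofj.y) ** 2 / (osk.var_y + ofj.var_y))

def bayes_merge(osk: DetectedObject, ofj: DetectedObject) -> None:
    """Step 324: modify Ofj in place by merging Osk into it."""
    # Positions are summed weighted by the associated variances (Bayes rule).
    ofj.x = (ofj.x * osk.var_x + osk.x * ofj.var_x) / (osk.var_x + ofj.var_x)
    ofj.y = (ofj.y * osk.var_y + osk.y * ofj.var_y) / (osk.var_y + ofj.var_y)
    # The fused variances shrink accordingly.
    ofj.var_x = osk.var_x * ofj.var_x / (osk.var_x + ofj.var_x)
    ofj.var_y = osk.var_y * ofj.var_y / (osk.var_y + ofj.var_y)
    # Velocities requalified here by a simple arithmetic mean, as suggested in the text.
    ofj.vx = 0.5 * (osk.vx + ofj.vx)
    ofj.vy = 0.5 * (osk.vy + ofj.vy)
```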
If, in step 323, the fusion criterion or one of the fusion criteria is not satisfied, in particular if the Mahalanobis distance is greater than or equal to the threshold, the submodule 32 executes a step 325 which consists in incrementing the index j so as to move to the next sub-list, associated with the object Ofj in the list Lf, until the last sub-list of index j = M is reached, which is checked in a step 326. Step 326 loops back to step 322 as long as j does not exceed M, in order to check whether one or more fusion criteria are satisfied for merging the synchronized object Osk of current index k with one of the merged objects of the list Lf. [0015] A value of the index j greater than M indicates that it is impossible to modify one of the objects Ofj by merging the object Osk into it. Thus, a positive answer to the test of step 326 activates a step 327 which consists in increasing the number M of objects of the list Lf by one unit, so as to add to it a merged object Ofj of index j = M which reproduces the current synchronized object Osk. [0016] Following step 324, in which the current synchronized object Osk has been merged into one of the merged objects Ofj pre-existing in the list Lf, or alternatively following step 327, in which the current synchronized object Osk has been added to the list Lf so as to generate a new merged object OfM, a step 328 consists in incrementing the index k so as to place the submodule 32 on the sub-list associated with the object Osk of index k in the list Lsi, looping back to step 320 as long as the index k does not exceed the number N of synchronized objects received in the list Lsi. In step 329, which tests whether this number has been exceeded, the submodule 32 loops back to the standby step 318 in the event of a positive test result, so as to hand control back to the submodule 31 to address the next list Lsi, until the last of the lists produced from the sensors has been processed. So as not to merge the same object Ofj of the list Lf with two distinct objects Osk of the list Lsi, a set C, initialized to the empty set in step 319, is intended to list each index j of an object Ofj of the list Lf that has already been modified or created from a previous object Osk of the list Lsi. The set C is then fed with the index j examined in step 324 or in step 327 when the submodule 32 passes through step 328, so that step 325 takes the next index j outside the set C. Thus, the submodule 31 delivers a complete list Lf of merged objects with their attributes, as merged data, to a risk assessment submodule 33. The submodule 33 accesses a cartographic data structure 40 in order to construct risk indicators relating to each of the objects Ofj, in combination with contextual data accessible in a data structure 50. The submodule 33 also implements known methods for precisely locating the vehicle 12 using a GNSS receiver, digital navigation maps accessible in the data structure 40 and algorithms for extracting geometric primitives by vision, such as those disclosed in the scientific publication "Tao, Z. and Bonnifait, P. and Fremont, V. and Ibanez-Guzman, J., Lane marking aided vehicle localization, IEEE Conference on Intelligent Transportation Systems (IEEE ITSC 2013), October 6-9, The Hague, The Netherlands, 2013".
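The association loop of steps 318 to 329, including the set C that prevents a merged object Ofj from absorbing two distinct objects of the same list Lsi, could then be sketched as follows, again assuming the illustrative types and helper functions of the previous sketches; the threshold value is left as a free parameter:

```python
import copy
from typing import List

def combine_lists(lf: List[DetectedObject], lsi: SensorList, threshold: float) -> None:
    """Fold one synchronized list Lsi into the merged list Lf (steps 318 to 329)."""
    merged_or_created = set()                    # set C of indices j already used
    for osk in lsi.objects:                      # loop over k (steps 319, 328, 329)
        for j, ofj in enumerate(lf):             # loop over j (steps 321, 325, 326)
            if j in merged_or_created:
                continue                         # step 325 only takes j outside C
            if mahalanobis_distance(osk, ofj) < threshold:
                bayes_merge(osk, ofj)            # step 324: modify Ofj
                merged_or_created.add(j)
                break
        else:
            lf.append(copy.deepcopy(osk))        # step 327: add a new merged object OfM
            merged_or_created.add(len(lf) - 1)
```

In such a sketch, a fusion module would call combine_lists once per synchronized list Lsi, after initializing lf from the first list received, which matches the initialization described for FIG. 6.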
The digital processing of the navigation maps extracted from the data structure 40 enables the submodule 33 to accurately place the vehicle in its geographical environment, so as to locate therein the various resident objects that are sources of interaction with the vehicle 12 and with other moving objects. The submodule 33 combines the configuration of the road scene thus formed with the context so as to define a level of risk associated with each object, in order to provide the navigation with high-level orders. In the case of a vehicle evolving in particular in urban areas, the use of cartographic data makes it possible to enrich the semantic description of the environment by evaluating a notion of risk related to obstacles with respect to the topology, the structure and the description of the scene in which the vehicle evolves. Ground markings, intersections, signs and other road features are used to refine the risk indicator of each object detected in the scene. For example, a vehicle coming from the front in a straight line but staying in its lane is potentially less dangerous than a vehicle arriving from the left at an intersection. The risk assessment submodule 33 generates the risk indicator associated with each object Ofj by combining different scores, each produced by a block 133, 233, 333, 433 on the model of those illustrated in Figures 7 to 10. [0017] Figures 7 and 8 present the production of scores based on the Cartesian position and the angular position of the obstacle, as well as on the cartographic data, which are essentially contextual in nature. For example, the closer the object is and the more it enters the lane of the vehicle 12, the more dangerous it is. At the entrance to a junction, the distance and angle data are important for quantifying the danger of the detected object, coming from the left or from the right, for example. With reference to FIG. 7, the block 133 computes a distance D separating the object Ofj from the vehicle 12. The distance D is, for example, computed in the usual way from the coordinates xf, yf of the object in the frame of reference relative to the vehicle 12. The block 133 then compares the computed distance D with a maximum distance Dmax obtained from the data structure 40. A distance score Sd is set to zero when the distance D is greater than the maximum distance Dmax. The distance score Sd is set to its maximum value, for example unity, when the distance D is zero. [0018] For other values of the distance D, the block 133 computes a score between zero and unity, for example by subtracting from unity the proportion of the maximum distance represented by the distance D. The maximum distance Dmax is the distance beyond which there is no need to take into account the risk factor, quantified by the distance score Sd, related to the detected object. The value of the maximum distance can be taken as equal to the range of the sensor, to the achievable braking distance of the vehicle 12, or to other values related to a predefined risk assessment strategy. [0019] With reference to FIG. 8, the block 233 computes an angle θ separating the line of sight of the object Ofj from the longitudinal axis of the vehicle 12. The angle θ is, for example, computed in the usual way from the conventional rules of trigonometry applied to the coordinates xf, yf of the object in the frame of reference relative to the vehicle 12. The block 233 then compares the computed angle θ with a maximum angle θmax obtained from the data structure 40. An angle score Sθ is set to zero when the angle θ is zero.
The angle score Sθ is set to its maximum value, for example unity, when the angle θ is greater than the maximum angle θmax. For other values of the angle θ, the block 233 computes a score between zero and unity, for example equal to the proportion of the maximum angle θmax represented by the angle θ. The maximum angle θmax is the angle beyond which the risk factor, quantified by the angle score Sθ of the detected object, is considered high. [0020] The value of the maximum angle can be taken as equal to the angular range of the sensor, to the steering capacity of the vehicle 12, or to other values related to a predefined risk assessment strategy. With reference to FIG. 9, the block 333 computes a speed V at which the object Ofj moves with respect to the vehicle 12. The speed V is, for example, computed in the usual way from the velocity coordinates Vxf, Vyf of the object in the frame of reference relative to the vehicle 12. The block 333 then compares the computed speed V with a maximum speed Vmax, obtained or not from the data structure 40. A speed score Sv is set to unity when the speed V is greater than the maximum speed Vmax. The speed score Sv is set to its minimum value, for example zero, when the speed V is zero. For other values of the speed V, the block 333 computes a score between zero and unity, for example equal to the proportion of the maximum speed Vmax represented by the speed V. [0021] The maximum speed Vmax is the speed beyond which the risk factor, quantified by the speed score Sv, at which the detected object moves is considered high. The value of the maximum speed may vary according to the orientation of the object with respect to the vehicle, the achievable braking distance of the vehicle 12, or other values related to a predefined risk assessment strategy. The block 433, whose operation is illustrated in FIG. 10, aims to produce a score based on the type of obstacle, so as to assign a vulnerability score according to the type of obstacle encountered: a pedestrian, for example, is more vulnerable than a car and must be given more weight in the risk assessment. The block 433 retrieves the obstacle type typef from the sub-list associated with the object Ofj. The block 433 then goes through an associative table 50 of vulnerability scores, each assigned to a type of obstacle, until it finds the obstacle type corresponding to the type of the current object. A type score St is set to the value So(typeo) associated in the table 50 with the type typeo equal to the type typef of the processed object. A block 533 allows the fusion module 30 to produce a list Lfe of objects Ofj, as explained now with reference to Fig. 11. The block 533 applies, to the scores generated by the blocks 133, 233, 333, 433, a combination rule which depends on the intended use and which may, for example, be a weighted average according to the needs of navigation. Other decision-support formalisms, such as fuzzy logic or belief functions, can be used to define the combination rule to be employed. The result of the combination gives a risk indicator Ir, which the block 533 adds to the sub-list of the object Ofj to which the list Lfe then points.
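As a hedged illustration of the score blocks 133, 233, 333, 433 and of a weighted-sum combination in block 533, one possible Python sketch is given below; the weights, the default vulnerability value and the use of atan2 for the sighting angle are assumptions made for the example, not values prescribed by the patent:

```python
import math
from typing import Dict, Tuple

def risk_indicator(ofj: DetectedObject,
                   d_max: float, theta_max: float, v_max: float,
                   vulnerability: Dict[str, float],
                   weights: Tuple[float, float, float, float] = (0.25, 0.25, 0.25, 0.25)) -> float:
    """Combine distance, angle, speed and type scores into a risk indicator Ir."""
    # Block 133: Sd = 1 at distance zero, 0 beyond Dmax, otherwise 1 - D/Dmax.
    d = math.hypot(ofj.x, ofj.y)
    s_d = 0.0 if d > d_max else 1.0 - d / d_max
    # Block 233: Stheta = 0 straight ahead, 1 at or beyond the maximum angle.
    theta = abs(math.atan2(ofj.y, ofj.x))        # x assumed along the longitudinal axis
    s_theta = min(theta / theta_max, 1.0)
    # Block 333: Sv = 0 for a static object, 1 at or beyond Vmax.
    v = math.hypot(ofj.vx, ofj.vy)
    s_v = min(v / v_max, 1.0)
    # Block 433: vulnerability score looked up by obstacle type (table 50).
    s_t = vulnerability.get(ofj.obj_type, 0.5)   # arbitrary default for unknown types
    # Block 533: weighted summation of the scores.
    w_d, w_theta, w_v, w_t = weights
    return w_d * s_d + w_theta * s_theta + w_v * s_v + w_t * s_t
```

For instance, calling risk_indicator with vulnerability = {"pedestrian": 1.0, "vehicle": 0.6} (hypothetical values) reproduces the idea that a pedestrian weighs more in the assessment than a car; other combination rules, such as fuzzy logic or belief functions, would simply replace the final weighted sum.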
Claims:
Claims (10) [0001] 1. A device for real-time signaling of at least one object (13, 14, 15) to a vehicle navigation module (60), comprising a first sensor (1) arranged to produce first sensor data comprising a first position and a first speed of the object captured with respect to the vehicle, characterized in that it comprises: at least one second sensor (2, 3) arranged to produce second sensor data comprising a second position and a second speed of the object captured with respect to the vehicle (12); a synchronization module (20) arranged to produce synchronized data comprising a first synchronized position computed from the first captured position and the first speed, and at least a second synchronized position computed from the second captured position and the second speed; a fusion module (30) arranged to produce merged data comprising a merged position computed from the first synchronized position and the second synchronized position, so as to signal said at least one object (13, 14, 15) to the module (60) by communicating to it all or part of the merged data. [0002] 2. Device according to claim 1, characterized in that it comprises a risk assessment module arranged to complete the merged data with a risk indicator associated with each object (13, 14, 15) reported and generated from the merged data. [0003] 3. Device according to claim 2, characterized in that it comprises a set of score computation blocks (133, 233, 333, 433) operating on one or more items of merged data, and a block (533) for combining the scores computed by all or part of the set of score computation blocks, so as to generate the risk indicator of each object by combination of scores. [0004] 4. Device according to claim 3, characterized in that said set comprises a distance score computation block (133), an angle score computation block (233), a speed score computation block (333) and/or an object type score computation block (433). [0005] 5. Device according to one of claims 3 or 4, characterized in that said set comprises at least one block (133) computing a score additionally from cartographic data. [0006] 6. Device according to one of claims 3 to 5, characterized in that said combination block (533) is arranged to combine the scores by weighted summation of all or part of the computed scores. [0007] 7. Device according to one of the preceding claims, characterized in that said fusion module (30) is arranged to merge at least one additional synchronized position into said merged position obtained from the first synchronized position and the second synchronized position. [0008] 8. Device according to one of the preceding claims, characterized in that the synchronized data accessed by said fusion module (30) further comprise a position variance associated with each synchronized position, so as to sum the synchronized positions, weighted by the associated variances, in order to merge the synchronized positions. [0009] 9.
Device according to one of the preceding claims, characterized in that: the synchronized data produced by the synchronization module (20) further comprise a position variance associated with each synchronized position; the fusion module (30) is arranged to produce merged data comprising a merged position from the first synchronized position and the second synchronized position, so as to signal one and the same object to the navigation module (60) by communicating said merged data to it, when a Mahalanobis distance between said synchronized positions is less than a predetermined threshold, and not to produce merged data comprising a merged position from the first synchronized position and the second synchronized position, so as to signal two different objects to the module (60), when said Mahalanobis distance is greater than the predetermined threshold. [0010] 10. Motor vehicle (12) characterized in that it comprises a device according to one of claims 1 to 9.
Similar technologies:
公开号 | 公开日 | 专利标题 EP3137355B1|2021-09-01|Device for designating objects to a navigation module of a vehicle equipped with said device US10417816B2|2019-09-17|System and method for digital environment reconstruction Gruyer et al.2017|Perception, information processing and modeling: Critical stages for autonomous driving applications Jog et al.2012|Pothole properties measurement through visual 2D recognition and 3D reconstruction GB2555214A|2018-04-25|Depth map estimation with stereo images WO2021057134A1|2021-04-01|Scenario identification method and computing device US10949684B2|2021-03-16|Vehicle image verification CN110738121A|2020-01-31|front vehicle detection method and detection system US20210303877A1|2021-09-30|Systems and methods for augmenting perception data with supplemental information Breitenstein et al.2020|Systematization of Corner Cases for Visual Perception in Automated Driving EP3189389B1|2021-09-08|Location and mapping device with associated method FR2896594A1|2007-07-27|Movable and fixed elements e.g. pedestrian, perception method, involves introducing information related to elements of one series in perception system for fusion of images of environment established by vehicles CN112861833A|2021-05-28|Vehicle lane level positioning method and device, electronic equipment and computer readable medium Heidecker et al.2021|An Application-Driven Conceptualization of Corner Cases for Perception in Highly Automated Driving EP3472015A1|2019-04-24|Method for determining a reference driving class CN110909656A|2020-03-24|Pedestrian detection method and system with integration of radar and camera FR3054672B1|2019-09-13|METHOD AND SYSTEM FOR ASSOCIATING MOBILE OBJECT DETECTION AND TRACKING DATA FOR MOTOR VEHICLE Ke et al.2020|Edge Computing for Real-Time Near-Crash Detection for Smart Transportation Applications FR2699667A1|1994-06-24|Method of assisting the piloting of an aircraft flying at low altitude. Sierra-González et al.2020|Leveraging dynamic occupancy grids for 3d object detection in point clouds Kotur et al.2021|Camera and LiDAR Sensor Fusion for 3D Object Tracking in a Collision Avoidance System Yang et al.2019|Analysis of Model Optimization Strategies for a Low-Resolution Camera-Lidar Fusion Based Road Detection Network FR3112215A1|2022-01-07|System and method for detecting an obstacle in a vehicle environment Ravishankaran2021|Impact on how AI in automobile industry has affected the type approval process at RDW Georoceanu2014|Extending parking assistance for automative user interfaces
Family patents:
Publication number | Publication date CN106255899A|2016-12-21| JP2017521745A|2017-08-03| JP6714513B2|2020-06-24| WO2015166156A1|2015-11-05| US20170043771A1|2017-02-16| US10035508B2|2018-07-31| FR3020616B1|2017-10-27| EP3137355A1|2017-03-08| EP3137355B1|2021-09-01| CN106255899B|2022-03-04|
Cited references:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题 WO2004031877A1|2002-09-27|2004-04-15|Volkswagen Aktiengesellschaft|Method and device for determining the environment of a vehicle| US7725228B2|2004-03-03|2010-05-25|Nissan Motor Co., Ltd.|Method and system for assisting a driver of a vehicle operating a vehicle traveling on a road| DE102009006113A1|2008-03-03|2009-09-10|Volkswagen Ag|Vehicle's surrounding representation providing method, involves subjecting sensor objects to fusion to generate fusion objects, and fusing existence possibilities of fusion objects based on existence possibilities of sensor objects| EP2223838A1|2009-02-27|2010-09-01|Nissan Motor Co., Ltd.|Vehicle driving operation support apparatus/process and cooperation control| US20120035846A1|2009-04-14|2012-02-09|Hiroshi Sakamoto|External environment recognition device for vehicle and vehicle system using same| DE19641261C1|1996-10-07|1998-02-12|Siemens Ag|Distance sensor measuring quality evaluation method| US5979586A|1997-02-05|1999-11-09|Automotive Systems Laboratory, Inc.|Vehicle collision warning system| JP2005001642A|2003-04-14|2005-01-06|Fujitsu Ten Ltd|Antitheft device, monitoring device, and antitheft system| BRPI0520270B1|2005-06-01|2019-10-01|Allstate Insurance Company|EVALUATION METHOD OF AT LEAST ONE INDIVIDUAL| US20070182623A1|2006-02-03|2007-08-09|Shuqing Zeng|Method and apparatus for on-vehicle calibration and orientation of object-tracking systems| US8135605B2|2006-04-11|2012-03-13|Bank Of America Corporation|Application risk and control assessment tool| JP5345350B2|2008-07-30|2013-11-20|富士重工業株式会社|Vehicle driving support device| US8935055B2|2009-01-23|2015-01-13|Robert Bosch Gmbh|Method and apparatus for vehicle with adaptive lighting system| JP2011002355A|2009-06-19|2011-01-06|Clarion Co Ltd|Navigation device and vehicle control device| EP2562053B1|2011-08-25|2016-04-27|Volvo Car Corporation|Method, computer program product and system for determining whether it is necessary to utilize a vehicle's safety equipment and vehicle comprising these| US9747802B2|2011-09-19|2017-08-29|Innovative Wireless Technologies, Inc.|Collision avoidance system and method for an underground mine environment| FR2993376B1|2012-07-12|2014-07-25|Renault Sa|METHOD FOR CONTROLLING THE TRACK OF AN AUTONOMOUS VEHICLE WITH STEERING WHEELS| DE102012216112A1|2012-09-12|2014-03-13|Robert Bosch Gmbh|Method and information system for determining a lane change intended or not intended by the driver when driving a vehicle| JP5846109B2|2012-11-20|2016-01-20|株式会社デンソー|Collision determination device and collision avoidance system| CN103558856A|2013-11-21|2014-02-05|东南大学|Service mobile robot navigation method in dynamic environment|US10527730B2|2015-10-22|2020-01-07|Toyota Motor Engineering & Manufacturing North America, Inc.|Object detection system| DE102016000209A1|2016-01-11|2017-07-13|Trw Automotive Gmbh|A control system and method for determining a pavement irregularity| US10782393B2|2016-02-18|2020-09-22|Aeye, Inc.|Ladar receiver range measurement using distinct optical path for reference light| JP6256531B2|2016-06-10|2018-01-10|三菱電機株式会社|Object recognition processing device, object recognition processing method, and automatic driving system| JP6656601B2|2016-08-29|2020-03-04|マツダ株式会社|Vehicle control device| US10095238B2|2016-12-14|2018-10-09|Ford Global Technologies, Llc|Autonomous vehicle object detection| JP6856452B2|2017-06-14|2021-04-07|トヨタ自動車株式会社|Target judgment device and driving support system| US10656652B2|2017-08-10|2020-05-19|Patroness, LLC|System and 
methods for sensor integration in support of situational awareness for a motorized mobile system| US10334331B2|2017-08-25|2019-06-25|Honda Motor Co., Ltd.|System and method for synchronized vehicle sensor data acquisition processing using vehicular communication| US10757485B2|2017-08-25|2020-08-25|Honda Motor Co., Ltd.|System and method for synchronized vehicle sensor data acquisition processing using vehicular communication| US10168418B1|2017-08-25|2019-01-01|Honda Motor Co., Ltd.|System and method for avoiding sensor interference using vehicular communication| CN111344647A|2017-09-15|2020-06-26|艾耶股份有限公司|Intelligent laser radar system with low-latency motion planning update| KR20190052212A|2017-11-08|2019-05-16|현대자동차주식회사|Vehicle and control method for the same| US11104334B2|2018-05-31|2021-08-31|Tusimple, Inc.|System and method for proximate vehicle intention prediction for autonomous vehicles| AU2019278974A1|2018-05-31|2021-01-07|Tusimple, Inc.|System and method for proximate vehicle intention prediction for autonomous vehicles| US11163317B2|2018-07-31|2021-11-02|Honda Motor Co., Ltd.|System and method for shared autonomy through cooperative sensing| US11181929B2|2018-07-31|2021-11-23|Honda Motor Co., Ltd.|System and method for shared autonomy through cooperative sensing| CN110971327A|2018-09-30|2020-04-07|长城汽车股份有限公司|Time synchronization method and device for environment target| CN110376583B|2018-09-30|2021-11-19|毫末智行科技有限公司|Data fusion method and device for vehicle sensor|
Legal status:
2015-04-21| PLFP| Fee payment|Year of fee payment: 2 | 2015-11-06| PLSC| Publication of the preliminary search report|Effective date: 20151106 | 2016-04-21| PLFP| Fee payment|Year of fee payment: 3 | 2017-04-19| PLFP| Fee payment|Year of fee payment: 4 | 2018-04-20| PLFP| Fee payment|Year of fee payment: 5 | 2019-04-18| PLFP| Fee payment|Year of fee payment: 6 | 2020-04-20| PLFP| Fee payment|Year of fee payment: 7 | 2021-04-23| PLFP| Fee payment|Year of fee payment: 8 |
Priority:
申请号 | 申请日 | 专利标题 FR1453910A|FR3020616B1|2014-04-30|2014-04-30|DEVICE FOR SIGNALING OBJECTS TO A NAVIGATION MODULE OF A VEHICLE EQUIPPED WITH SAID DEVICE|FR1453910A| FR3020616B1|2014-04-30|2014-04-30|DEVICE FOR SIGNALING OBJECTS TO A NAVIGATION MODULE OF A VEHICLE EQUIPPED WITH SAID DEVICE| PCT/FR2015/050939| WO2015166156A1|2014-04-30|2015-04-09|Device for signalling objects to a navigation module of a vehicle equipped with this device| CN201580023677.XA| CN106255899B|2014-04-30|2015-04-09|Device for signaling an object to a navigation module of a vehicle equipped with such a device| US15/307,273| US10035508B2|2014-04-30|2015-04-09|Device for signalling objects to a navigation module of a vehicle equipped with this device| EP15725719.7A| EP3137355B1|2014-04-30|2015-04-09|Device for designating objects to a navigation module of a vehicle equipped with said device| JP2016564576A| JP6714513B2|2014-04-30|2015-04-09|An in-vehicle device that informs the navigation module of the vehicle of the presence of an object| 相关专利